• Critical echo state network dynamics by means of Fisher information maximization 

      Bianchi, Filippo Maria; Livi, Lorenzo; Jenssen, Robert; Alippi, Cesare (Chapter, 2017-07-03)
      The computational capability of an Echo State Network (ESN), expressed in terms of low prediction error and high short-term memory capacity, is maximized on the so-called “edge of criticality”. In this paper we present a novel, unsupervised approach to identify this edge and, accordingly, determine the hyperparameter configuration that maximizes network performance. The proposed method is ...
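      The entry above concerns tuning ESN hyperparameters toward the edge of criticality. As background, a minimal illustrative ESN reservoir sketch (not the authors' method; all names and values here are chosen for illustration) in which the key hyperparameter, the spectral radius of the reservoir matrix, is set near 1:

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      n_inputs, n_reservoir = 1, 100
      spectral_radius = 0.95  # illustrative value close to the edge of criticality

      # Random input and reservoir weights; rescale the reservoir matrix so its
      # largest eigenvalue magnitude equals the target spectral radius.
      W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
      W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
      W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

      def run_reservoir(u):
          """Drive the reservoir with a scalar input sequence u; return all states."""
          x = np.zeros(n_reservoir)
          states = []
          for u_t in u:
              x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
              states.append(x.copy())
          return np.array(states)

      states = run_reservoir(np.sin(np.linspace(0, 8 * np.pi, 200)))
      ```

      In practice a readout (e.g. ridge regression) is trained on `states`; the cited work instead selects `spectral_radius` and related hyperparameters in an unsupervised way via Fisher information.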
    • Determination of the Edge of Criticality in Echo State Networks Through Fisher Information Maximization 

      Bianchi, Filippo Maria; Livi, Lorenzo; Alippi, Cesare (Journal article; Peer reviewed, 2017-03)
      It is a widely accepted fact that the computational capability of recurrent neural networks (RNNs) is maximized on the so-called “edge of criticality.” Once the network operates in this configuration, it performs efficiently on a specific application both in terms of: 1) low prediction error and 2) high short-term memory capacity. Since the behavior of recurrent networks is strongly influenced by the ...
    • Hierarchical Representation Learning in Graph Neural Networks with Node Decimation Pooling 

      Bianchi, Filippo Maria; Grattarola, Daniele; Livi, Lorenzo; Alippi, Cesare (Journal article; Peer reviewed, 2020-12-31)
      In graph neural networks (GNNs), pooling operators compute local summaries of input graphs to capture their global properties, and they are fundamental for building deep GNNs that learn hierarchical representations. In this work, we propose the Node Decimation Pooling (NDP), a pooling operator for GNNs that generates coarser graphs while preserving the overall graph topology. During training, the ...
    • Multiplex visibility graphs to investigate recurrent neural network dynamics 

      Bianchi, Filippo Maria; Livi, Lorenzo; Alippi, Cesare; Jenssen, Robert (Journal article; Peer reviewed, 2017-03-10)
      A recurrent neural network (RNN) is a universal approximator of dynamical systems, whose performance often depends on sensitive hyperparameters. Tuning them properly may be difficult and, typically, based on a trial-and-error approach. In this work, we adopt a graph-based framework to interpret and characterize internal dynamics of a class of RNNs called echo state networks (ESNs). We design principled ...
    • Scalable Spatiotemporal Graph Neural Networks 

      Cini, Andrea; Marisca, Ivan; Bianchi, Filippo Maria; Alippi, Cesare (Journal article, 2023)
      Neural forecasting of spatiotemporal time series drives both research and industrial innovation in several relevant application domains. Graph neural networks (GNNs) are often the core component of the forecasting architecture. However, in most spatiotemporal GNNs, the computational complexity scales up to a quadratic factor with the length of the sequence times the number of links in the graph, ...
    • Spectral clustering with graph neural networks for graph pooling 

      Bianchi, Filippo Maria; Grattarola, Daniele; Alippi, Cesare (Conference object, 2020)
      Spectral clustering (SC) is a popular clustering technique to find strongly connected communities on a graph. SC can be used in Graph Neural Networks (GNNs) to implement pooling operations that aggregate nodes belonging to the same cluster. However, the eigendecomposition of the Laplacian is expensive and, since clustering results are graph-specific, pooling methods based on SC must perform a ...
    • Understanding Pooling in Graph Neural Networks 

      Grattarola, Daniele; Zambon, Daniele; Bianchi, Filippo Maria; Alippi, Cesare (Journal article; Peer reviewed, 2022-07-21)
      Many recent works in the field of graph machine learning have introduced pooling operators to reduce the size of graphs. In this article, we present an operational framework to unify this vast and diverse literature by describing pooling operators as the combination of three functions: selection, reduction, and connection (SRC). We then introduce a taxonomy of pooling operators, based on some of ...
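      The last entry describes pooling operators as a combination of selection, reduction, and connection (SRC) functions. A minimal, illustrative sketch of that decomposition (an assumed mean-pooling instance, not any specific operator from the paper), using a one-hot cluster-assignment matrix:

      ```python
      import numpy as np

      def src_pool(X, A, clusters):
          """Pool a graph via the SRC decomposition:
          - selection: 'clusters' assigns each node to a supernode,
          - reduction: supernode features = mean of member node features,
          - connection: supernode adjacency = summed inter-cluster edge weights."""
          k = clusters.max() + 1
          # Selection matrix S (n x k): one-hot cluster assignment per node.
          S = np.eye(k)[clusters]
          # Reduction: average node features within each cluster.
          X_pool = (S.T @ X) / S.sum(axis=0, keepdims=True).T
          # Connection: aggregate edges between (and within) clusters.
          A_pool = S.T @ A @ S
          return X_pool, A_pool

      # Tiny example: 4 nodes with 2 features each, grouped into 2 supernodes.
      X = np.arange(8.0).reshape(4, 2)
      A = np.array([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=float)
      X_pool, A_pool = src_pool(X, A, np.array([0, 0, 1, 1]))
      ```

      Different choices of the three functions (learned assignments, top-k selection, sum reduction, etc.) recover different pooling operators from the literature, which is the unifying point of the SRC framework.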